Aiming at the difficulty of recognizing key entity information in the police field, a neural network model based on BERT (Bidirectional Encoder Representations from Transformers), namely BERT-BiLSTM-Attention-CRF, was proposed to recognize and extract the related named entities; meanwhile, corresponding entity annotation specifications were designed for the different case types. In the model, BERT pre-trained word vectors were used in place of word vectors trained by traditional methods such as Skip-gram and Continuous Bag of Words (CBOW), improving the representation ability of the word vectors and solving the word boundary division problem that arises when Chinese corpora are trained with character vectors. In addition, an attention mechanism was introduced to improve the architecture of the classical Named Entity Recognition (NER) model BiLSTM-CRF. The BERT-BiLSTM-Attention-CRF model achieves an accuracy of 91% on the test set, which is 7 percentage points higher than the CRF++ baseline and 4 percentage points higher than the BiLSTM-CRF model; the F1 scores of all entity types exceed 0.87.
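At the top of the BERT-BiLSTM-Attention-CRF stack, the CRF layer scores tag transitions and decodes the globally best tag sequence with the Viterbi algorithm. The sketch below is a minimal pure-Python illustration of that decoding step only: the tag set, emission scores (standing in for BiLSTM-plus-attention outputs), and transition scores are all toy values invented for the example, not the paper's trained parameters.

```python
# Illustrative Viterbi decoding as performed by a CRF output layer.
# All scores below are hypothetical placeholders, not trained values.

TAGS = ["O", "B-PER", "I-PER"]  # assumed BIO-style tag set for illustration

def viterbi_decode(emissions, transitions, tags):
    """emissions: per-token dicts {tag: score} (stand-ins for BiLSTM+attention
    outputs); transitions: {(prev_tag, tag): score} (stand-ins for the CRF's
    learned transition matrix). Returns the highest-scoring tag sequence."""
    best = {t: emissions[0][t] for t in tags}  # best path score ending in t
    history = []                               # backpointers, one dict per step
    for emit in emissions[1:]:
        scores, ptrs = {}, {}
        for t in tags:
            prev = max(tags, key=lambda p: best[p] + transitions[(p, t)])
            scores[t] = best[prev] + transitions[(prev, t)] + emit[t]
            ptrs[t] = prev
        best = scores
        history.append(ptrs)
    # Backtrack from the best-scoring final tag.
    tag = max(tags, key=lambda t: best[t])
    path = [tag]
    for ptrs in reversed(history):
        tag = ptrs[tag]
        path.append(tag)
    return path[::-1]

# Toy example: three tokens, e.g. "... John Smith". The large negative
# score on ("O", "I-PER") encodes that I- cannot follow O in BIO tagging.
transitions = {
    ("O", "O"): 0.5,     ("O", "B-PER"): 0.5,     ("O", "I-PER"): -5.0,
    ("B-PER", "O"): 0.2, ("B-PER", "B-PER"): -1.0, ("B-PER", "I-PER"): 1.0,
    ("I-PER", "O"): 0.2, ("I-PER", "B-PER"): -1.0, ("I-PER", "I-PER"): 0.5,
}
emissions = [
    {"O": 2.0, "B-PER": 0.1, "I-PER": 0.0},
    {"O": 0.1, "B-PER": 2.0, "I-PER": 0.5},
    {"O": 0.2, "B-PER": 0.3, "I-PER": 2.0},
]
print(viterbi_decode(emissions, transitions, TAGS))  # → ['O', 'B-PER', 'I-PER']
```

A trained CRF layer learns the transition scores jointly with the rest of the network; the decoding logic itself is the same standard dynamic program shown here.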